Weighted Gaussian entropy and determinant inequalities

Authors

  • Yuri M. Suhov
  • Salimeh Yasaei Sekeh
  • Izabella Stuhl
Abstract

We produce a series of results extending information-theoretical inequalities (discussed by Dembo–Cover–Thomas in 1989-1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.
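For reference, the weighted (differential) entropy underlying these results attaches a weight function φ ≥ 0 to the usual integrand; the LaTeX sketch below states it, along with the unweighted Gaussian entropy formula that connects entropy bounds to determinant inequalities. The symbol h^w_φ is an illustrative notation, not necessarily the paper's own.

  % Weighted differential entropy of X with density f and weight \varphi \ge 0
  h^{\mathrm{w}}_{\varphi}(X) = -\int_{\mathbb{R}^d} \varphi(x)\, f(x) \log f(x)\, \mathrm{d}x
  % With \varphi \equiv 1 this is the usual differential entropy; for X \sim \mathrm{N}(\mu, \Sigma),
  %   h(X) = \tfrac{1}{2} \log\bigl((2\pi e)^d \det \Sigma\bigr),
  % which is how Gaussian entropy inequalities yield relations between determinants.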

Related articles

Extended inequalities for weighted Renyi entropy involving generalized Gaussian densities

In this paper the author analyzes the weighted Renyi entropy in order to derive several inequalities in the weighted case. Furthermore, using the proposed notions of the α-th generalized deviation and the (α, p)-th weighted Fisher information, extended versions of the moment-entropy, Fisher information and Cramér-Rao inequalities in terms of generalized Gaussian densities are given. The weighted p-Renyi ent...
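For orientation, the classical (unweighted) Rényi entropy of order α is recalled below in LaTeX; the weighted variant shown after it is only a plausible reading of the snippet, with a weight φ inserted into the integrand, and may differ from the paper's exact definition.

  % Classical Renyi entropy of order \alpha > 0, \alpha \ne 1, for a density f
  h_{\alpha}(f) = \frac{1}{1-\alpha} \log \int f(x)^{\alpha}\, \mathrm{d}x
  % One natural weighted analogue (an assumption, for illustration only):
  % h^{\mathrm{w}}_{\varphi,\alpha}(f) = \frac{1}{1-\alpha} \log \int \varphi(x)\, f(x)^{\alpha}\, \mathrm{d}x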

On relative weighted entropies with central moments weight functions

The definition and a number of inequalities for a standard DE were illustrated in [9, 3, 7]. Furthermore, in [1, 6] the initial definition and results on weighted entropy were introduced. Following [10, 5, 8, 11], recently in [4, 2, 13, 14, 15] an approach similar to that for the standard DE has led to certain properties and applications of information-theoretical weighted entropies with a number of de...

Information theoretic inequalities

The role of inequalities in information theory is reviewed and the relationship of these inequalities to inequalities in other branches of mathematics is developed. Index Terms - Information inequalities, entropy power, Fisher information, uncertainty principles. I. PREFACE: INEQUALITIES IN INFORMATION THEORY. Inequalities in information theory have been driven by a desire to solve communication th...
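The entropy-power inequality is a centerpiece of this survey; the LaTeX sketch below records it together with its Gaussian specialization, one standard route from entropy inequalities to determinant inequalities such as Minkowski's.

  % Entropy power of an R^n-valued X with differential entropy h(X)
  N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}
  % Entropy-power inequality for independent X and Y:
  N(X + Y) \ge N(X) + N(Y)
  % Applied to Gaussian vectors with covariance matrices A, B \succ 0, it gives Minkowski's inequality:
  \det(A + B)^{1/n} \ge \det(A)^{1/n} + \det(B)^{1/n}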

Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
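In the scalar case the three inequalities just mentioned take a compact form; the LaTeX sketch below uses the variance σ², the Fisher information J(X), and the entropy power N(X) = e^{2h(X)}/(2πe).

  % Moment-entropy inequality: the Gaussian maximizes entropy at fixed variance
  h(X) \le \tfrac{1}{2} \log\bigl(2\pi e\, \sigma^2\bigr)
  % Stam's inequality:
  N(X)\, J(X) \ge 1
  % Together they imply the Cramér-Rao inequality:
  \sigma^2\, J(X) \ge 1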

Gaussian mixtures: entropy and geometric inequalities

A symmetric random variable is called a Gaussian mixture if it has the same distribution as the product of two independent random variables, one being positive and the other a standard Gaussian random variable. Examples of Gaussian mixtures include random variables with densities proportional to e^{-|t|^p} and symmetric p-stable random variables, where p ∈ (0, 2]. We obtain various sharp moment an...
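The defining identity can be written in one line; the LaTeX note below also records the standard example behind the e^{-|t|^p} densities at p = 1, namely the Laplace law as a normal variance mixture with exponential mixing.

  % X is a Gaussian mixture if, for some Y > 0 independent of Z \sim \mathrm{N}(0,1),
  X \overset{d}{=} Y \cdot Z
  % Example (p = 1): with E \sim \mathrm{Exp}(1), the product \sqrt{2E}\, Z has the Laplace density \tfrac{1}{2} e^{-|t|}.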

Journal:
  • CoRR

Volume: abs/1505.01753  Issue: -

Pages: -

Publication date: 2015